
Search results for: "Internet Watch Foundation"


5 mentions found


New York CNN — More than a thousand images of child sexual abuse material were found in a massive public dataset used to train popular AI image-generating models, Stanford Internet Observatory researchers said in a study published earlier this week. The presence of these images in the training data may make it easier for AI models to create new and realistic AI-generated child abuse imagery, or “deepfake” images of children being exploited. The dataset the Stanford researchers examined, known as LAION 5B, contains billions of images scraped from the internet, including from social media and adult entertainment websites. Of the more than five billion images in the dataset, the researchers said they identified at least 1,008 instances of child sexual abuse material. Stability AI models were trained on a filtered subset of that dataset.
[1/2] TikTok app logo is seen in this illustration taken August 22, 2022. REUTERS/Dado Ruvic/Illustration/File Photo LONDON, Oct 30 (Reuters) - Tech firms including TikTok, Snapchat and Stability AI have signed a joint statement pledging to work together to counter child sex abuse images generated by artificial intelligence. "We resolve to sustain the dialogue and technical innovation around tackling child sexual abuse in the age of AI," the statement read. "We resolve to work together to ensure that we utilise responsible AI for tackling the threat of child sexual abuse and commit to continue to work collaboratively to ensure the risks posed by AI to tackling child sexual abuse do not become insurmountable." Britain cited data from the Internet Watch Foundation showing that in one dark web forum, users had shared nearly 3,000 AI-generated images of child sexual abuse material.
NEW YORK (AP) — The already-alarming proliferation of child sexual abuse images on the internet could become much worse if something is not done to put controls on artificial intelligence tools that generate deepfake photos, a watchdog agency warned on Tuesday. In a written report, the U.K.-based Internet Watch Foundation urges governments and technology providers to act quickly before a flood of AI-generated images of child sexual abuse overwhelms law enforcement investigators and vastly expands the pool of potential victims. If it isn’t stopped, the flood of deepfake child sexual abuse images could bog down investigators trying to rescue children who turn out to be virtual characters. “That is just incredibly shocking.” Sexton said his charity organization, which is focused on combating online child sexual abuse, first began fielding reports about abusive AI-generated imagery earlier this year. The report particularly targets the European Union, where there is a debate over surveillance measures that could automatically scan messaging apps for suspected images of child sexual abuse, even if the images are not previously known to law enforcement.
Crypto exchanges enabled online child sex-abuse profiteer
2022-11-23 | www.reuters.com | time to read: +22 min
These sites often included links for users to pay via crypto exchanges, the IWF told Reuters, declining to name companies. “For those people looking to make money from child sexual abuse, crypto has lowered the barrier,” said Dan Sexton, the IWF’s chief technology officer. The Dark Scandals website, owned by Michael Mohammad, instructs users to send tokens to a Dark Scandals digital wallet to purchase content. While banks and payment platforms demanded more details from online merchants, many crypto exchanges for years requested little or no information from clients. The IWF received more reports last year of websites selling child abuse imagery for crypto than any year prior.
The U.S. Justice Department, in a report this September, said many crypto exchanges still "make little or no effort to comply" with know-your-customer requirements. Asked at his trial for his opinion of crypto, Mohammad noted, "Privacy is something that a lot of users value."
Total: 5